perm filename AB.TO[P,JRA] blob
sn#164448 filedate 1975-06-17 generic text, type C, neo UTF8
More on other work.
The C.B. Jones and C.D. Allen sequence of papers consists of formal
paper-and-pencil developments of complex algorithms. They are done in
a stepwise manner (indeed some are done in VDL, a LISP-like language),
with reasonably formal proofs supplied for each level of elaboration.
The detail of the proofs and the style of the arguments are almost
mechanizable a la FOL. That is, the style could become the basis for
a proof component of this system, so that we could resort to a FOL-like
program to "proof-check" the partially specified algorithms.
The Snowdon system (PEARL) was an attempt to build a programming
environment to support stepwise refinement. It was badly done, but it
started me thinking about how things should be done.
Immediate "verification" which the system can do almost for free:
1) consistency checking of data types and "arity" of the function
definition-call structure.
Many of the errors which LISP programmers make, particularly in large
programs, are "data type" errors. Since LISP really has only two types
--sexpr and list-- and no checking even on these, type errors can proliferate
until they finally get caught by something like taking car of an atom.
The best way to rid programs of this kind of error is to enforce some kind of
typing as the high-level abstract LISP program is specified. It doesn't cost
much to implement and can save a lot of grief later. The typing need
not slow down the running program much: either the checks can be optimized out
by a clever translator, or macro expansion could at least alleviate much
of the overhead.
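To make the idea concrete, here is a minimal sketch of such declaration-time type checks -- in Python rather than LISP, and with an invented `typed` wrapper standing in for what macro expansion would generate; the real system's form is not specified here:

```python
from functools import wraps

def typed(*arg_types):
    """Attach runtime type checks to a function's arguments.
    In the proposed system these checks would be generated by macro
    expansion and could be stripped out by a clever translator."""
    def decorate(f):
        @wraps(f)
        def checked(*args):
            for i, (a, t) in enumerate(zip(args, arg_types)):
                if not isinstance(a, t):
                    raise TypeError(
                        f"{f.__name__}: argument {i} is "
                        f"{type(a).__name__}, expected {t.__name__}")
            return f(*args)
        return checked
    return decorate

# A list-valued function declared to take two lists:
@typed(list, list)
def append_lists(x, y):
    return x + y
```

With the checks in place, `append_lists(1, [2])` fails immediately with a clear message instead of proliferating until something like car of an atom catches it.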
Calling with the wrong number of arguments is usually caught by LISP when
the program runs, but it certainly could be caught earlier, at specification
time, with a corresponding saving of the programmer's time.
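A sketch of that specification-time arity check, again in Python for concreteness (the helper `check_arity` and the sample function `cons` are invented for illustration):

```python
import inspect

def check_arity(f, n_args):
    """Decide, without running the program, whether calling f with
    n_args positional arguments matches f's definition."""
    params = inspect.signature(f).parameters.values()
    positional = [p for p in params
                  if p.kind in (p.POSITIONAL_ONLY, p.POSITIONAL_OR_KEYWORD)]
    required = [p for p in positional if p.default is p.empty]
    has_varargs = any(p.kind == p.VAR_POSITIONAL for p in params)
    max_args = float('inf') if has_varargs else len(positional)
    return len(required) <= n_args <= max_args

def cons(head, tail):
    return [head] + tail
```

A specification-time pass would run `check_arity` over every definition-call pair and flag `cons(1)` or `cons(1, 2, 3)` before anything executes.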
The area of "real" verification is wide open. I've run some experiments
--see the formalisms and the included section on unify-- and have some other
ideas. The Hoare ideas, but not necessarily the formalism, are quite useful,
I believe.
By the way, stay away from Luckham as a reference; since I quit working for
him he has been thoroughly unpleasant.
Related to verification is program transformation a la John Darlington
and changes of data representation a la Jim Low.
Darlington has written a system which takes recursion equations in
a style very similar to the one I'm proposing and automatically
transforms them into more efficient but equivalent programs: it
removes recursion, rewrites loops, minimizes storage usage, etc.
Some of his tools should be used as program-transformation commands.
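The flavor of such a transformation can be shown on the standard example -- the recursion equations for reverse rewritten into an accumulator-passing form. This is a hand-worked Python illustration of the kind of rewriting meant, not Darlington's system itself:

```python
def reverse_naive(xs):
    # Direct transcription of the recursion equations:
    #   reverse []       = []
    #   reverse (x:rest) = reverse rest ++ [x]
    # Quadratic, because each level re-copies the appended list.
    if not xs:
        return []
    return reverse_naive(xs[1:]) + [xs[0]]

def reverse_acc(xs, acc=None):
    # The transformed, accumulator-passing version: the append has
    # been fused into the recursion, giving a linear-time program
    # equivalent to the one above.
    if acc is None:
        acc = []
    if not xs:
        return acc
    return reverse_acc(xs[1:], [xs[0]] + acc)
```

A transformation command would carry the programmer from the first definition to the second, together with the (easy) proof that the two agree on all inputs.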
Low's stuff you know about.
The Boyer-Moore theorem prover is also an applicable area.
(Boyer or Moore would be sympathetic to this scheme, by the way.)
Their prover (running in LISP) can do a reasonable job of proving
properties of LISP programs (I have some ideas on how to make their scheme
better, by the way). Using commands to a Boyer-Moore prover, the student
can verify his intuitions about parts and pieces of his program --
things like reverse[reverse[x]] = x, or the append/reverse chestnut.
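The two chestnuts can at least be stated as executable checks, short of proof. This is a crude random-testing sketch in Python, not the Boyer-Moore interface; `reverse`, `append`, and `check` are names invented for the illustration:

```python
import random

def reverse(xs):
    return xs[::-1]

def append(xs, ys):
    return xs + ys

def check(prop, trials=100):
    """Test a two-argument property over random lists of small integers.
    A counterexample refutes the intuition; survival merely supports it,
    which is where a real prover earns its keep."""
    for _ in range(trials):
        xs = [random.randint(0, 9) for _ in range(random.randint(0, 8))]
        ys = [random.randint(0, 9) for _ in range(random.randint(0, 8))]
        if not prop(xs, ys):
            return False
    return True

# reverse[reverse[x]] = x
involution = lambda xs, ys: reverse(reverse(xs)) == xs
# the append/reverse chestnut:
# reverse[append[x, y]] = append[reverse[y], reverse[x]]
chestnut = lambda xs, ys: reverse(append(xs, ys)) == append(reverse(ys),
                                                            reverse(xs))
```

Both properties hold universally over lists, so the checks always succeed; the point is that the student can run exactly these statements against pieces of his own program before asking a prover to certify them.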